Appendix: Sharing Features in Multi-class Boosting via Group Sparsity

Authors

  • Sakrapee Paisitkriangkrai
  • Chunhua Shen
  • Anton van den Hengel
Abstract

In this document we provide a complete derivation for multi-class boosting with group sparsity, and a full explanation of the ADMM algorithm presented in the main paper.

1 Multi-class boosting with group sparsity

We first provide the derivation for the multi-class logistic loss with ℓ1,2-norm regularization. We then show the difference between our boosting with the ℓ1,2-norm and with the ℓ1,∞-norm. Finally, we briefly discuss our group-sparsity-based boosting for any general convex loss.
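The group-sparsity mechanism the abstract refers to can be illustrated by the proximal operator of the ℓ1,2-norm, the building block that an ADMM solver for this kind of regularizer would apply at each iteration. Below is a minimal sketch, assuming each row of a coefficient matrix W holds one weak learner's coefficients across all classes (so a row is one "group"); the function name `prox_group_l12` and the threshold `tau` are illustrative choices, not identifiers from the paper, and this is not the authors' full solver.

```python
import numpy as np

def prox_group_l12(W, tau):
    """Group soft-thresholding: the proximal operator of tau * ||W||_{1,2}.

    Rows (groups) whose Euclidean norm is below tau are zeroed out
    entirely; the rest are shrunk toward zero.  Zeroing whole rows is
    what makes a weak learner be kept or discarded jointly for all
    classes, i.e. feature sharing via group sparsity.
    """
    W = np.asarray(W, dtype=float)
    norms = np.linalg.norm(W, axis=1, keepdims=True)          # per-group norms
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return scale * W

# Example: the first group survives (shrunk), the second is removed.
W = np.array([[3.0, 4.0],     # norm 5.0  -> scaled by 1 - 1/5 = 0.8
              [0.1, 0.1]])    # norm ~0.14 -> below tau, zeroed
print(prox_group_l12(W, 1.0))
```

Under the ℓ1,∞-norm, the per-group shrinkage would instead act on each row's maximum absolute entry, which is one concrete way to see the difference between the two regularizers discussed in the appendix.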


Related articles

A Direct Approach to Multi-class Boosting and Extensions

Boosting methods combine a set of moderately accurate weak learners to form a highly accurate predictor. Despite the practical importance of multi-class boosting, it has received far less attention than its binary counterpart. In this work, we propose a fully-corrective multi-class boosting formulation which directly solves the multi-class problem without dividing it into multiple binary classi...


Multi Class Learning with Individual Sparsity

Multi-class problems are everywhere. Given an input, the goal is to predict one of a few possible classes. Most previous work reduced learning to minimizing the empirical loss over some training set plus an additional regularization term, promoting simple models or some other prior knowledge. Many learning regularizations promote sparsity, that is, small models or small number of features, as per...


An Efficient Threshold Verifiable Multi-Secret Sharing Scheme Using Generalized Jacobian of Elliptic Curves

In a (t,n)-threshold secret sharing scheme, a secret s is distributed among n participants such that any group of t or more participants can reconstruct the secret together, but no group of fewer than t participants can do so. In this paper, we propose a verifiable (t,n)-threshold multi-secret sharing scheme based on Shao and Cao, and the intractability of the elliptic curve discrete logar...


Simple Training of Dependency Parsers via Structured Boosting

Recently, significant progress has been made on learning structured predictors via coordinated training algorithms such as conditional random fields and maximum margin Markov networks. Unfortunately, these techniques are based on specialized training algorithms, are complex to implement, and expensive to run. We present a much simpler approach to training structured predictors by applying a boo...


Robust Multi-View Boosting with Priors

Many learning tasks for computer vision problems can be described by multiple views or multiple features. These views can be exploited in order to learn from unlabeled data, a.k.a. “multi-view learning”. In these methods, usually the classifiers iteratively label each other a subset of the unlabeled data and ignore the rest. In this work, we propose a new multi-view boosting algorithm that, unl...



Publication date: 2012